Binary output layer of feedforward neural networks for solving multi-class classification problems

Authors

  • Sibo Yang
  • Chao Zhang
  • Wei Wu
Abstract

This short note considers the design of the output-layer nodes of feedforward neural networks for solving multi-class classification problems with r (r ≥ 3) classes of samples. The common and conventional setting of the output layer, called the “one-to-one approach” in this paper, is as follows: the output layer contains r output nodes corresponding to the r classes, and for an input sample of the i-th class (1 ≤ i ≤ r), the ideal output is 1 for the i-th output node and 0 for all the other output nodes. We propose in this paper a new “binary approach”: suppose 2^(q−1) < r ≤ 2^q with q ≥ 2; then we let the output layer contain q output nodes, and let the ideal outputs for the r classes be designed in a binary manner. Numerical experiments carried out in this paper show that our binary approach performs as well as the traditional one-to-one approach while using fewer output nodes.
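As a minimal sketch of the two target encodings described above (the function names are ours, not from the paper): the one-to-one approach assigns each class a one-hot vector of length r, while the binary approach writes the class index in binary using q = ⌈log₂ r⌉ output nodes.

```python
import math

def one_to_one_targets(label, r):
    """One-to-one (one-hot) ideal output: r nodes, 1 at the class position, 0 elsewhere."""
    targets = [0] * r
    targets[label] = 1
    return targets

def binary_targets(label, r):
    """Binary ideal output: q nodes with 2^(q-1) < r <= 2^q, i.e. q = ceil(log2(r)).

    The class index (0-based here) is encoded as a q-bit binary number,
    most significant bit first.
    """
    q = math.ceil(math.log2(r))  # q >= 2 automatically for r >= 3
    return [(label >> (q - 1 - k)) & 1 for k in range(q)]

# For r = 5 classes, one-to-one needs 5 output nodes; binary needs only q = 3.
print(one_to_one_targets(2, 5))  # [0, 0, 1, 0, 0]
print(binary_targets(3, 5))      # [0, 1, 1]
```

During training, each approach simply substitutes its vector as the desired network output for the corresponding class; at test time, the binary approach decodes the thresholded outputs back into a class index.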


Similar resources

Adaptive Learning with Binary Neurons

An efficient incremental learning algorithm for classification tasks, called NetLines, well adapted for both binary and real-valued input patterns, is presented. It generates small, compact feedforward neural networks with one hidden layer of binary units and binary output units. A convergence theorem ensures that solutions with a finite number of hidden units exist for both binary and real-valued...


Adaptive Leader-Following and Leaderless Consensus of a Class of Nonlinear Systems Using Neural Networks

This paper deals with leader-following and leaderless consensus problems of high-order multi-input/multi-output (MIMO) multi-agent systems with unknown nonlinear dynamics in the presence of uncertain external disturbances. The agents may have different dynamics and communicate together under a directed graph. A distributed adaptive method is designed for both cases. The structures of the contro...


Hardness Results for General Two-Layer Neural Networks

We deal with the problem of learning a general class of 2-layer neural networks in polynomial time. The considered neural networks consist of k linear threshold units on the hidden layer and an arbitrary binary output unit. We show NP-completeness of the consistency problem for classes that use an arbitrary set of binary output units containing a function which depends on all input dimensions. ...


Output Reachable Set Estimation and Verification for Multi-Layer Neural Networks

In this paper, the output reachable set estimation and safety verification problems for multi-layer perceptron neural networks are addressed. First, a concept called maximum sensitivity is introduced and, for a class of multi-layer perceptrons whose activation functions are monotonic functions, the maximum sensitivity can be computed via solving convex optimization problems. Then, using a simula...


Supervised self-coding in multilayered feedforward networks

Supervised neural-network learning algorithms have proven very successful at solving a variety of learning problems. However, they suffer from a common problem of requiring explicit output labels. This requirement makes such algorithms implausible as biological models. In this paper, it is shown that pattern classification can be achieved, in a multilayered feedforward neural network, without r...



Journal:
  • CoRR

Volume: abs/1801.07599  Issue: 

Pages: -

Publication year: 2018